# Lightweight and Efficient

**Arrowneo AME 4x3B V0.1 MoE** · MIT · DataPilot · 51 · 3
A Mixture of Experts model designed to serve as the soul of AI virtual anchors, combining code generation, instruction following, and multi-turn dialogue capabilities.
Tags: Large Language Model, Supports Multiple Languages
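The "4x3B MoE" naming above reflects a Mixture-of-Experts design: several expert networks live inside one model, and a learned gate routes each input to only a few of them, so most parameters stay inactive per token. A minimal sketch of top-k gating in plain Python (all names, weights, and sizes here are illustrative, not taken from the model itself):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of gate logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_weights, k=2):
    """Route input x to the top-k experts by gate score and mix their outputs,
    renormalizing the selected gate scores so they sum to 1."""
    scores = softmax([sum(w * xi for w, xi in zip(ws, x)) for ws in gate_weights])
    top = sorted(range(len(experts)), key=lambda i: scores[i], reverse=True)[:k]
    norm = sum(scores[i] for i in top)
    return sum(scores[i] / norm * experts[i](x) for i in top)

# Four toy "experts": each just scales the first input feature differently.
experts = [lambda x, c=c: c * x[0] for c in (1.0, 2.0, 3.0, 4.0)]
gate_weights = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [-1.0, 0.0]]
y = moe_forward([0.5, 1.0], experts, gate_weights, k=2)
```

With `k=2` only two of the four experts run per input, which is why a "4x3B" model costs far less per token than a dense model of the same total parameter count.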
**Deepseek V3 0324 GGUF** · MIT · MaziyarPanahi · 97.25k · 19
GGUF-quantized version of DeepSeek-V3-0324, suitable for local text-generation tasks.
Tags: Large Language Model
**PEGASUS Medium** · MIT · fatihfauzan26 · 87 · 1
PEGASUS Medium is a fine-tuned version of the PEGASUS model, optimized for abstractive summarization of Indonesian news articles.
Tags: Text Generation, Other
**Depth Anything V2 Base** · depth-anything · 66.95k · 17
Depth Anything V2 is a state-of-the-art monocular depth estimation (MDE) model, trained on 595,000 synthetically annotated images and over 62 million real unannotated images.
Tags: 3D Vision, English
**Chinese Electra Base Generator** · Apache-2.0 · hfl · 15 · 0
Chinese ELECTRA is a pre-trained model developed by the Harbin Institute of Technology-iFLYTEK Joint Lab (HFL), based on the ELECTRA model released by Google and Stanford University. It combines a small parameter count with high performance.
Tags: Large Language Model, Transformers, Chinese
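The ELECTRA generator entries in this list come from the same pre-training scheme: a small generator replaces some input tokens, and a discriminator is trained to detect which positions were replaced. A toy sketch of that corruption-and-labeling step in plain Python (purely illustrative, with random replacement standing in for the generator's sampled predictions; not HFL's actual code):

```python
import random

def electra_corrupt(tokens, vocab, replace_prob=0.3, seed=0):
    """ELECTRA-style corruption: randomly swap some tokens for vocabulary
    samples (standing in for a small generator's predictions) and record
    per-position labels: 1 = replaced, 0 = original."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < replace_prob:
            corrupted.append(rng.choice(vocab))
            labels.append(1)
        else:
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels

tokens = ["the", "court", "ruled", "in", "favor"]
vocab = ["the", "cat", "law", "ran", "judge"]
corrupted, labels = electra_corrupt(tokens, vocab)
```

The discriminator then reads `corrupted` and predicts `labels` at every position, which yields a training signal on all tokens rather than only the masked minority in BERT-style masked language modeling; this is one reason ELECTRA models stay competitive at small parameter counts.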
**Legal Bert Small Uncased** · nlpaueb · 11.99k · 21
A lightweight BERT model specialized for the legal domain; at only 33% of the size of BERT-BASE, it significantly improves efficiency while maintaining performance.
Tags: Large Language Model, English
**Chinese Electra Small Generator** · Apache-2.0 · hfl · 16 · 0
Chinese ELECTRA is a pre-trained model developed by the Harbin Institute of Technology-iFLYTEK Joint Lab based on Google's ELECTRA architecture, with only 1/10 the parameters of BERT but comparable performance.
Tags: Large Language Model, Transformers, Chinese
© 2025 AIbase